A communications protocol is a formal description of digital message formats and the rules for exchanging those messages in or between computing systems and in telecommunications. Protocols may include signaling, authentication, and error detection and correction capabilities. A protocol describes the syntax, semantics, and synchronization of communication and may be implemented in hardware, in software, or in both.
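To make these three concerns concrete, the following Python sketch defines a hypothetical message format; the field layout, version number, and message-type values are invented for illustration and do not come from any real standard.

```python
# A minimal sketch of the three things a protocol specifies:
#   - syntax:          how bytes are laid out on the wire
#   - semantics:       what each field and value means
#   - synchronization: who may send what, and when
import struct

# Syntax: 1-byte version, 1-byte message type, 2-byte payload length
# (big-endian), followed by the payload itself.
HEADER = struct.Struct("!BBH")

# Semantics: the meaning assigned to each message-type value.
MSG_REQUEST, MSG_RESPONSE = 1, 2

def encode(msg_type: int, payload: bytes) -> bytes:
    return HEADER.pack(1, msg_type, len(payload)) + payload

def decode(frame: bytes) -> tuple[int, int, bytes]:
    version, msg_type, length = HEADER.unpack_from(frame)
    return version, msg_type, frame[HEADER.size:HEADER.size + length]

# Synchronization: an ordering rule -- in this toy protocol, a response
# is only valid after a request has been sent.
request = encode(MSG_REQUEST, b"hello")
print(decode(request))  # (1, 1, b'hello')
```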
The protocols in human communication are rules about appearance, speaking, listening, and understanding. These rules, also called protocols of conversation, represent different layers of communication, and they work together to help people communicate successfully. The need for protocols also applies to computing systems: network engineers have written rules for communication that must be strictly followed for successful host-to-host communication. These rules operate at different levels of sophistication, governing, for example, which physical connections to use, how a host listens, how to interrupt or terminate a communication, and which language to use. Such rules, or protocols, that work together to ensure successful communication are grouped into what is known as a protocol suite.
The widespread use and expansion of communications protocols has been a prerequisite for the development of the Internet. The Internet Protocol (IP) and the Transmission Control Protocol (TCP) are the most important of these, and the term Internet Protocol Suite, or TCP/IP, refers to the collection of the Internet's most widely used protocols. Most of the communication protocols in use on the Internet are described in the Request for Comments (RFC) documents of the Internet Engineering Task Force (IETF).
Object-oriented programming has extended the use of the term to include the programming protocols available for connections and communication between objects. Protocols exist at many levels of abstraction and complexity.
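As a minimal sketch of this object-oriented sense of the term, the Python fragment below uses typing.Protocol to state the methods an object must offer in order to take part in an exchange; the Channel and LoopbackChannel names here are hypothetical, not from any standard library.

```python
# A "protocol" in the object-oriented sense: the set of operations an
# object must support to participate in a communication.
from typing import Protocol

class Channel(Protocol):
    def send(self, data: bytes) -> None: ...
    def receive(self) -> bytes: ...

class LoopbackChannel:
    """Satisfies Channel structurally, with no explicit inheritance."""
    def __init__(self) -> None:
        self._buffer = b""

    def send(self, data: bytes) -> None:
        self._buffer += data

    def receive(self) -> bytes:
        data, self._buffer = self._buffer, b""
        return data

def exchange(channel: Channel, message: bytes) -> bytes:
    channel.send(message)
    return channel.receive()

print(exchange(LoopbackChannel(), b"ping"))  # b'ping'
```

Any object with matching send and receive methods satisfies the protocol structurally; no inheritance relationship is required.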
Generally, only the simplest protocols are used alone. Most protocols, especially in the context of communications or networking, are layered together into protocol stacks, where the various tasks of communication are divided among the protocols in the stack.
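A rough sketch of this division of labor, with invented layer names and header contents rather than any real stack: each sending layer encapsulates the data handed down from the layer above with its own header, and the receiver peels the headers off in reverse order.

```python
# Sending side: each layer adds its own header and knows nothing about
# the headers of other layers.
def app_layer_send(text: str) -> bytes:
    return text.encode("utf-8")

def transport_layer_send(segment: bytes, port: int) -> bytes:
    return port.to_bytes(2, "big") + segment        # add a "port" header

def network_layer_send(packet: bytes, host: int) -> bytes:
    return host.to_bytes(4, "big") + packet         # add an "address" header

wire = network_layer_send(
    transport_layer_send(app_layer_send("hi"), 80), 0x7F000001)

# Receiving side: unwrap the headers in the opposite order.
host, rest = int.from_bytes(wire[:4], "big"), wire[4:]
port, data = int.from_bytes(rest[:2], "big"), rest[2:]
print(hex(host), port, data.decode("utf-8"))  # 0x7f000001 80 hi
```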
Whereas a protocol stack denotes a specific combination of protocols that work together, a reference model is a software architecture that defines each layer and the services it should offer. A prominent reference model is the seven-layer OSI model, which is used for conceptualizing protocol stacks and peer entities. The reference model also provides an opportunity to teach more general software engineering concepts such as information hiding, modularity, and delegation of tasks. The model has endured in spite of the demise of many of the protocols (and protocol stacks) originally sanctioned by the International Organization for Standardization (ISO). Despite the OSI model's success as a teaching tool, the design process and the architecture of the Internet do not specifically adhere to its principles, although a general analogy is commonly recognized. Instead, Internet design is based on the Internet Protocol Suite.
Communications protocols are often designed, developed, and promulgated by groups of engineering professionals organized by standards organizations such as the International Organization for Standardization (ISO), the International Telecommunication Union (ITU), the Institute of Electrical and Electronics Engineers (IEEE), and the Internet Engineering Task Force (IETF). The IETF maintains the protocols in use on the Internet. The IEEE controls many software and hardware protocols in the electronics industry for commercial and consumer devices. The ITU is an umbrella organization of telecommunications engineers that designs the public switched telephone network (PSTN), as well as many radio communication systems. For marine electronics, the NMEA standards are used.
Many communication protocols are also designed by commercial entities, such as small and large corporations in the software and data communications industries. Xerox Corporation (Xerox PARC), Microsoft, Intel, and many others have developed communications protocols that are in widespread use. Some were initially developed as proprietary protocols but were later published and accepted by the broader public; others remain proprietary.
The IP protocol is the basis for many other commonly used protocols, including the Transmission Control Protocol (TCP), the User Datagram Protocol (UDP), and the Internet Control Message Protocol (ICMP).
Many services are in turn based on protocols that run on top of TCP, such as HTTP for the web, SMTP for e-mail, and FTP for file transfer.
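As an illustration of this layering, the sketch below issues a minimal HTTP request over a TCP socket: the application protocol is just structured text carried by a TCP connection, which the operating system in turn carries in IP packets. It assumes network access to example.com on port 80 and is not production-quality HTTP handling.

```python
import socket

# Open a TCP connection; IP routing happens below this in the OS.
with socket.create_connection(("example.com", 80), timeout=5) as conn:
    # HTTP defines the syntax of this request line and these headers.
    conn.sendall(b"GET / HTTP/1.1\r\n"
                 b"Host: example.com\r\n"
                 b"Connection: close\r\n\r\n")
    response = b""
    while chunk := conn.recv(4096):   # read until the server closes
        response += chunk

print(response.split(b"\r\n")[0].decode())  # e.g. HTTP/1.1 200 OK
```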
In general, protocol testers or protocol analyzers work by capturing the information exchanged between a device under test (DUT) and a reference device known to operate properly. In the example of a manufacturer producing a new keyboard for a personal computer, the DUT would be the keyboard and the reference device, the PC. The information exchanged between the two devices is governed by rules set out in a technical specification called a "communication protocol". Both the nature of the communication and the actual data exchanged are defined by the specification. Since communication protocols are state-dependent (what should happen next depends on what previously happened), specifications are complex and the documents describing them can be hundreds of pages.
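A toy illustration of this state dependence, with states and transitions invented for the example: the same message can be legal or a protocol error depending on what came before.

```python
# Allowed transitions of a toy connection-oriented protocol:
# (current state, incoming message) -> next state.
TRANSITIONS = {
    ("CLOSED",      "CONNECT"):    "ESTABLISHED",
    ("ESTABLISHED", "DATA"):       "ESTABLISHED",
    ("ESTABLISHED", "DISCONNECT"): "CLOSED",
}

def run(messages: list[str]) -> str:
    state = "CLOSED"
    for msg in messages:
        nxt = TRANSITIONS.get((state, msg))
        if nxt is None:
            return f"protocol error: {msg!r} not allowed in state {state}"
        state = nxt
    return f"final state: {state}"

print(run(["CONNECT", "DATA", "DISCONNECT"]))
# final state: CLOSED
print(run(["DATA"]))
# protocol error: 'DATA' not allowed in state CLOSED
```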
The captured information is decoded from raw digital form into a human-readable format that permits users of the protocol tester to easily review the exchanged information. Protocol testers vary in their abilities to display data in multiple views, automatically detect errors, determine the root causes of errors, generate timing diagrams, etc.
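A sketch of that decoding step, reusing the hypothetical frame layout from the earlier example: raw captured bytes are interpreted against the specification's field definitions and rendered in readable form.

```python
import struct

def decode_frame(raw: bytes) -> str:
    # Interpret the bytes against the (hypothetical) field layout.
    version, msg_type, length = struct.unpack_from("!BBH", raw)
    payload = raw[4:4 + length]
    names = {1: "REQUEST", 2: "RESPONSE"}
    return (f"version={version} type={names.get(msg_type, 'UNKNOWN')} "
            f"len={length} payload={payload!r}")

captured = struct.pack("!BBH", 1, 2, 3) + b"ack"
print(decode_frame(captured))
# version=1 type=RESPONSE len=3 payload=b'ack'
```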
Some protocol testers can also generate traffic and thus act as the reference device. Such testers generate protocol-correct traffic for functional testing, and may also have the ability to deliberately introduce errors to test the DUT's ability to deal with error conditions.
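One way such error injection might look in miniature, with an invented frame format and a toy one-byte checksum: a correct frame is generated, then a single bit is flipped so the receiver's error handling can be exercised.

```python
import struct

def make_frame(payload: bytes) -> bytes:
    checksum = sum(payload) & 0xFF                  # toy 1-byte checksum
    return struct.pack("!BB", len(payload), checksum) + payload

def inject_error(frame: bytes, bit: int) -> bytes:
    corrupted = bytearray(frame)
    corrupted[bit // 8] ^= 1 << (bit % 8)           # flip a single bit
    return bytes(corrupted)

def check(frame: bytes) -> bool:
    length, checksum = frame[0], frame[1]
    return sum(frame[2:2 + length]) & 0xFF == checksum

good = make_frame(b"ping")
bad = inject_error(good, bit=20)                    # corrupt the payload
print(check(good), check(bad))                      # True False
```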
Protocol testing is an essential step towards commercialization of standards-based products. It helps to ensure that products from different manufacturers will operate together properly ("interoperate") and so satisfy customer expectations.